942 results for Deterministic filtering


Relevance: 70.00%

Abstract:

In essay 1 we develop a new autoregressive conditional process to capture both the changes in and the persistence of the intraday seasonal (U-shaped) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant, and specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology for decomposing return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information-arrival component and a noise component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying; furthermore, the covariance between the informational and noise components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders do, since uninformed traders cannot distinguish between price changes caused by information arrivals and those caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resulting seasonally adjusted variance series is decomposed into components driven by unexpected information arrivals and by noise in order to examine informativeness.
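As a minimal illustration of the deterministic special case that this essay generalizes, the sketch below (Python; the bin counts, the simulated U-shape, and all numbers are invented for illustration) estimates one fixed seasonal factor per intraday bin and divides it out:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated intraday returns: 50 days x 13 half-hour bins with a
# U-shaped volatility pattern (high near the open and close, low at
# midday). The pattern and sizes are invented for illustration.
n_days, n_bins = 50, 13
u_shape = 1.0 + 0.8 * np.cos(np.linspace(0.0, 2.0 * np.pi, n_bins))
returns = rng.standard_normal((n_days, n_bins)) * u_shape

# Deterministic filter (the special case the essay generalizes): one
# fixed seasonal factor per bin, estimated from the cross-day average
# absolute return, then divided out.
seasonal = np.abs(returns).mean(axis=0)
filtered = returns / seasonal

vol_before = np.abs(returns).mean(axis=0)
vol_after = np.abs(filtered).mean(axis=0)
print(vol_before.max() / vol_before.min())  # large: the U-shape
print(vol_after.max() / vol_after.min())    # 1.0: pattern removed
```

Because the factor is fixed across days, this deterministic filter cannot track a pattern that evolves over time, which is the limitation the autoregressive conditional process addresses.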

Relevance: 60.00%

Abstract:

This dissertation analyses network-based localization methods, with emphasis on radio-frequency fingerprint correlation methods (DCM, Database Correlation Methods). Network-based methods require no modification of the mobile stations (MS) and are therefore able to estimate the position of legacy handsets, i.e., those without specific positioning support. This property, combined with the high availability and accuracy of DCM methods, makes them viable candidates for several location-based applications, and in particular for locating calls to emergency numbers (police, civil defense, fire department, etc.) originated from cellular phones. Two techniques for reducing the average time needed to produce a position estimate are formulated: deterministic filtering and optimized search using genetic algorithms. The evaluation functions used in DCM methods are modified by inserting a factor that represents the inaccuracy inherent in the signal-strength measurements performed by the MS. The proposed modifications are evaluated experimentally in second- and third-generation cellular networks in urban and suburban environments, as well as in wireless local area networks in an indoor environment. The feasibility of building correlation databases (CDB) from propagation modelling is analysed, as is the effect of calibrating empirical propagation models on the accuracy of DCM methods. One of the proposed DCM methods, using a calibrated CDB, outperformed several other DCM methods published in the literature, achieving in an urban area the accuracy that FCC (Federal Communications Commission) regulations for the E911 (Enhanced 911) service require of network-based methods.
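A minimal sketch of the DCM matching step with a deterministic filter in front of it. The fingerprint database, grid, and serving-cell rule are all invented for illustration, and the measurement is taken noise-free purely to keep the example deterministic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical correlation database (CDB): RSS fingerprints (dBm) from
# 4 base stations at 100 candidate grid positions. All values invented.
positions = rng.uniform(0, 1000, size=(100, 2))       # metres
fingerprints = rng.uniform(-110, -60, size=(100, 4))  # dBm

# Measurement reported by the mobile at grid point 42 (noise-free here,
# purely to keep the example deterministic).
measured = fingerprints[42].copy()

# Deterministic filtering: keep only candidates whose strongest base
# station matches the mobile's serving cell, shrinking the search space
# before the (costlier) correlation step.
serving = int(np.argmax(measured))
candidates = np.flatnonzero(np.argmax(fingerprints, axis=1) == serving)

# Correlation step: smallest Euclidean distance in signal space wins.
d = np.linalg.norm(fingerprints[candidates] - measured, axis=1)
best = int(candidates[np.argmin(d)])
print(best)  # 42
```

The filter trades a cheap per-candidate test for a smaller correlation search, which is the spirit of the time-reduction techniques formulated in the dissertation.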

Relevance: 30.00%

Abstract:

We recast the reconstruction problem of diffuse optical tomography (DOT) in a pseudo-dynamical framework and develop a method to recover the optical parameters using particle filters, i.e., stochastic filters based on Monte Carlo simulations. In particular, we have implemented two such filters, viz., the bootstrap (BS) filter and the Gaussian-sum (GS) filter and employed them to recover optical absorption coefficient distribution from both numerically simulated and experimentally generated photon fluence data. Using either indicator functions or compactly supported continuous kernels to represent the unknown property distribution within the inhomogeneous inclusions, we have drastically reduced the number of parameters to be recovered and thus brought the overall computation time to within reasonable limits. Even though the GS filter outperformed the BS filter in terms of accuracy of reconstruction, both gave fairly accurate recovery of the height, radius, and location of the inclusions. Since the present filtering algorithms do not use derivatives, we could demonstrate accurate contrast recovery even in the middle of the object where the usual deterministic algorithms perform poorly owing to the poor sensitivity of measurement of the parameters. Consistent with the fact that the DOT recovery, being ill posed, admits multiple solutions, both the filters gave solutions that were verified to be admissible by the closeness of the data computed through them to the data used in the filtering step (either numerically simulated or experimentally generated). (C) 2011 Optical Society of America
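The bootstrap step can be illustrated on a toy problem. The sketch below is not the DOT forward model (the real problem solves the photon diffusion equation); it substitutes a scalar exponential map, recovers its parameter with pseudo-dynamics, likelihood weighting, and multinomial resampling, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the DOT forward map: the real problem solves the
# photon diffusion equation; a scalar exponential keeps the filtering
# machinery in focus. All numbers are illustrative.
def forward(theta):
    return np.exp(-2.0 * theta)

true_theta = 0.8
sigma_y = 0.01
data = forward(true_theta) + rng.normal(0.0, sigma_y, size=50)

# Bootstrap (sampling-importance-resampling) particle filter with the
# pseudo-dynamics theta_k = theta_{k-1} + small artificial noise.
n = 500
particles = rng.uniform(0.0, 2.0, n)        # prior over the parameter
for y in data:
    particles += rng.normal(0.0, 0.02, n)   # artificial evolution
    w = np.exp(-0.5 * ((y - forward(particles)) / sigma_y) ** 2)
    w /= w.sum()
    particles = particles[rng.choice(n, size=n, p=w)]  # resample

print(particles.mean())  # concentrates near true_theta = 0.8
```

Note that no derivative of `forward` is ever taken, which mirrors the derivative-free property the paper exploits in regions of poor measurement sensitivity.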

Relevance: 30.00%

Abstract:

Since the pioneering work of Gibson in 1950, Shape-From-Texture has been considered by researchers as a hard problem, mainly due to restrictive assumptions which often limit its applicability. We assume a very general stochastic homogeneity and a perspective camera model, for both deterministic and stochastic textures. A multi-scale distortion is efficiently estimated with a previously presented method based on Fourier analysis and Gabor filters. The novel 3D reconstruction method that we propose applies to general shapes, including non-developable and extensive surfaces. Our algorithm is accurate, robust and compares favorably to the present state of the art of Shape-From-Texture. Results show its application to non-invasively studying shape changes with laid-on textures, while rendering and retexturing of cloth is suggested for future work. © 2009 IEEE.
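A one-dimensional sketch of the Gabor-based frequency-estimation step: convolve with complex Gabor kernels at several candidate frequencies and keep the frequency of maximal response. The filter bank and parameters are illustrative, not those of the paper:

```python
import numpy as np

# 1-D sketch of a Gabor local-frequency estimate: convolve with complex
# Gabor kernels at several candidate frequencies and keep the frequency
# of maximal response. Parameters are illustrative.
def gabor_response(signal, freq, sigma=8.0):
    t = np.arange(-4 * sigma, 4 * sigma + 1)
    kernel = np.exp(-t**2 / (2 * sigma**2)) * np.exp(2j * np.pi * freq * t)
    return np.abs(np.convolve(signal, kernel, mode="same"))

t = np.arange(512)
texture = np.sin(2 * np.pi * 0.10 * t)   # known frequency 0.10
freqs = [0.05, 0.10, 0.20]
responses = [gabor_response(texture, f)[256] for f in freqs]
best_freq = freqs[int(np.argmax(responses))]
print(best_freq)  # 0.1
```

In Shape-From-Texture, the spatial variation of such local frequency estimates is what encodes the perspective distortion, and hence the surface shape.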

Relevance: 30.00%

Abstract:

1. Ecologists are debating the relative role of deterministic and stochastic determinants of community structure. Although the high diversity and strong spatial structure of soil animal assemblages could provide ecologists with an ideal ecological scenario, surprisingly little information is available on these assemblages.
2. We studied species-rich soil oribatid mite assemblages from a Mediterranean beech forest and a grassland. We applied multivariate regression approaches and analysed spatial autocorrelation at multiple spatial scales using Moran's eigenvectors. Results were used to partition community variance in terms of the amount of variation uniquely accounted for by environmental correlates (e.g. organic matter) and geographical position. Estimated neutral diversity and immigration parameters were also applied to a soil animal group for the first time to simulate patterns of community dissimilarity expected under neutrality, thereby testing neutral predictions.
3. After accounting for spatial autocorrelation, the correlation between community structure and key environmental parameters disappeared: about 40% of community variation consisted of spatial patterns independent of measured environmental variables such as organic matter. Environmentally independent spatial patterns encompassed the entire range of scales accounted for by the sampling design (from tens of cm to 100 m). This spatial variation could be due to either unmeasured but spatially structured variables or stochastic drift mediated by dispersal. Observed levels of community dissimilarity were significantly different from those predicted by neutral models.
4. Oribatid mite assemblages are dominated by processes involving both deterministic and stochastic components and operating at multiple scales. Spatial patterns independent of the measured environmental variables are a prominent feature of the targeted assemblages, but patterns of community dissimilarity do not match neutral predictions. This suggests that either niche-mediated competition or environmental filtering or both are contributing to the core structure of the community. This study indicates new lines of investigation for understanding the mechanisms that determine the signature of the deterministic component of animal community assembly.

Relevance: 30.00%

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular, ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance within the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without additional computational expense. Moreover, a transform-based alternative to the EnKBF is developed, in which the operations are performed in ensemble space instead of state space, and the advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble over different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding distortion of the mean value of the function. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is also found that the accuracy of medium-term forecasts is increased by using the RAW filter.
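The RAW filter itself is compact. The sketch below (Python; step sizes and filter parameters are illustrative, not SPEEDY's) integrates a linear oscillator with leapfrog and compares the classic Robert-Asselin filter (alpha = 1) against Williams' recommended alpha slightly above 0.5; the exact solution keeps unit amplitude, so the returned amplitudes expose the spurious damping:

```python
import numpy as np

# Leapfrog integration of dx/dt = i*omega*x with the RAW filter.
# alpha = 1 recovers the classic Robert-Asselin filter; Williams
# recommends alpha slightly above 0.5. Parameters are illustrative.
def integrate(omega=1.0, dt=0.2, steps=200, nu=0.2, alpha=0.53):
    x_prev = 1.0 + 0.0j                  # (filtered) value at step n-1
    x_curr = np.exp(1j * omega * dt)     # exact value at step n
    for _ in range(steps):
        x_next = x_prev + 2.0 * dt * 1j * omega * x_curr   # leapfrog
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_prev = x_curr + alpha * d          # filter the middle level
        x_curr = x_next - (1.0 - alpha) * d  # RAW also nudges the new level
    return abs(x_curr)                   # exact solution keeps |x| = 1

amp_ra = integrate(alpha=1.0)    # classic RA: visibly damped
amp_raw = integrate(alpha=0.53)  # RAW: amplitude nearly preserved
print(amp_ra, amp_raw)
```

Splitting the displacement between the middle and new time levels is what lets RAW suppress the computational mode without the first-order amplitude error of the original filter.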

Relevance: 30.00%

Abstract:

Distributed Denial of Service (DDoS) attacks are currently recognized as one of the most serious problems on the Internet. Their aim is to prevent legitimate users from accessing desired resources, such as network bandwidth, so the immediate task of DDoS defense is to provide as many resources as possible to legitimate users while an attack is under way. Unfortunately, most current defense approaches cannot efficiently detect and filter out attack traffic. Our approach is to detect network anomalies using a neural network, deploy the system at distributed routers, identify the attack packets, and then filter them out. The marks placed in the IP header by a group of IP traceback schemes, Deterministic Packet Marking (DPM) and Flexible Deterministic Packet Marking (FDPM), assist this process of identifying attack packets. The experimental results show that this approach can defend against both intensive and subtle DDoS attacks, and can capture the characteristic DDoS pattern of traffic converging from multiple sources on a single victim. The results also show that the marks in IP headers enhance the sensitivity and accuracy of detection, thus improving legitimate traffic throughput and reducing attack traffic throughput. The approach therefore filters DDoS attack traffic precisely and effectively.
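A minimal sketch of the idea, with heavy simplifications: the per-window features are hypothetical, and a single perceptron stands in for the paper's (unspecified here) neural network. The count of distinct DPM/FDPM marks is used as one feature, since many distinct ingress marks on traffic converging on one victim is the multi-source signature the marks make visible:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical per-window traffic features: [packet rate, number of
# distinct DPM/FDPM marks observed]. Both classes and all numbers are
# synthetic; a single perceptron stands in for the neural network.
X_norm = rng.normal([100.0, 5.0], [20.0, 2.0], size=(200, 2))   # normal
X_atk = rng.normal([900.0, 60.0], [100.0, 10.0], size=(200, 2)) # attack
X = np.vstack([X_norm, X_atk])
y = np.array([0] * 200 + [1] * 200)

X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features
w, b = np.zeros(2), 0.0
for _ in range(20):                        # perceptron training epochs
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += 0.1 * (yi - pred) * xi
        b += 0.1 * (yi - pred)

acc = np.mean([(1 if xi @ w + b > 0 else 0) == yi for xi, yi in zip(X, y)])
print(acc)
```

The point of the sketch is only that mark-derived features are ordinary numeric inputs to whatever classifier sits at the router.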

Relevance: 30.00%

Abstract:

Internet Protocol (IP) traceback is the enabling technology for controlling Internet crime. In this paper, we present a novel and practical IP traceback system called Flexible Deterministic Packet Marking (FDPM), which gives a defense system the ability to find the real sources of attacking packets that traverse the network. While a number of other traceback schemes exist, FDPM provides innovative features for tracing the source of IP packets and achieves better tracing capability than the others. In particular, FDPM adopts a flexible mark-length strategy to make it compatible with different network environments; it also adaptively changes its marking rate according to the load of the participating router through a flexible flow-based marking scheme. Evaluations of both simulations and a real system implementation demonstrate that FDPM requires a moderately small number of packets to complete the traceback process, adds little additional load to routers, and can trace a large number of sources in one traceback process with low false positive rates. The built-in overload prevention mechanism makes the system capable of achieving a satisfactory traceback result even when the router is heavily loaded. The original motivation for this traceback system was DDoS defense, and it has been used not only to trace DDoS attack packets but also to enhance the filtering of attack traffic; it also has a wide array of applications in other security systems.
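A sketch of the basic deterministic-marking mechanism that FDPM builds on, with an invented fixed layout (four 8-bit address segments plus a 2-bit index in the 16-bit Identification field; FDPM itself varies the mark length to suit the network):

```python
import random

random.seed(3)

# Deterministic Packet Marking (DPM) sketch: an ingress edge router
# splits its 32-bit IP address into four 8-bit segments and stamps one
# randomly chosen segment, plus a 2-bit segment index, into each
# packet's 16-bit Identification field (10 of the 16 bits used). FDPM
# generalizes this with flexible mark lengths; the fixed layout here is
# for illustration only.
def mark(router_ip: int) -> int:
    idx = random.randrange(4)
    segment = (router_ip >> (8 * idx)) & 0xFF
    return (idx << 8) | segment

def reconstruct(marks) -> int:
    ip = 0
    for m in marks:
        idx, segment = m >> 8, m & 0xFF
        ip |= segment << (8 * idx)
    return ip

router = 0xC0A80101                      # 192.168.1.1
packets = [mark(router) for _ in range(200)]
print(hex(reconstruct(packets)))  # 0xc0a80101 once all 4 segments seen
```

The victim needs only a handful of packets carrying each index to reassemble the ingress address, which is why traceback completes with a moderately small number of packets.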

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to present a new geometric model based on the mathematical morphology paradigm, specialized to provide determinism to the classic morphological operations. Determinism is needed to model dynamic processes that require an order of application, as is the case when designing and manufacturing objects in CAD/CAM environments. Design/methodology/approach – The basic trajectory-based operation is the foundation of the proposed morphological specialization. This operation allows the definition of morphological operators that obtain sequentially ordered sets of points from the boundary of the target objects, a determinism absent from the classical morphological paradigm. From this basic operation, the complete set of morphological operators is redefined to incorporate the concepts of boundary and determinism: trajectory-based erosion and dilation, and other morphological filtering operations. Findings – This new morphological framework allows the definition of complex three-dimensional objects, providing arithmetical support for generating machining trajectories, one of the most complex problems currently addressed in CAD/CAM. Originality/value – The model integrates the processes of design and manufacture, thereby avoiding the accuracy and integrity problems presented by other classic geometric models, which divide these processes into two phases. Furthermore, the morphological operators work on point sets, so the geometric data structures and operations are intrinsically simple and efficient. Another important advantage is that no excessive computational resources are needed, because only the points on the boundary are processed.
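A point-set sketch of the classical (unordered) operators, plus a deterministic, ordered boundary extraction in the spirit of the trajectory-based operators. The function names and the ordering rule (sorting boundary points by angle around their centroid) are illustrative, not the paper's:

```python
import math

# Point-set morphology: classic dilation and erosion over integer
# grids, plus a toy deterministic boundary traversal. The ordering rule
# (sort by angle around the centroid) is illustrative only.
def dilate(obj, se):
    return {(x + dx, y + dy) for (x, y) in obj for (dx, dy) in se}

def erode(obj, se):
    return {(x, y) for (x, y) in obj
            if all((x + dx, y + dy) in obj for (dx, dy) in se)}

def ordered_boundary(obj):
    nb = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    boundary = [(x, y) for (x, y) in obj
                if any((x + dx, y + dy) not in obj for dx, dy in nb)]
    cx = sum(x for x, _ in boundary) / len(boundary)
    cy = sum(y for _, y in boundary) / len(boundary)
    return sorted(boundary, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

square = {(x, y) for x in range(5) for y in range(5)}
cross = {(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)}  # 4-connected SE
opened = dilate(erode(square, cross), cross)        # morphological opening
path = ordered_boundary(square)
print(len(opened), len(path))  # 21 16
```

The classical `opened` result is an unordered set; `ordered_boundary` returns a sequence, and it is that sequential order that a machining trajectory can follow.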